
    Self-Adaptive resource allocation for event monitoring with uncertainty in Sensor Networks

    Event monitoring is an important application of sensor networks. Multiple parties, with different surveillance targets, can share the same network, with limited sensing resources, to monitor their events of interest simultaneously. Such a system earns profit by allocating sensing resources to missions that collect event-related information (e.g., videos, photos, electromagnetic signals). We address the problem of dynamically assigning resources to missions so as to maximize profit under uncertainty in event occurrence. We consider time-varying resource demands and profits, and multiple concurrent surveillance missions. We model each mission as a sequence of monitoring attempts, each allocated a certain amount of resources, on a specific set of events that occur as a Markov process. We propose a Self-Adaptive Resource Allocation algorithm (SARA) to adaptively and efficiently allocate resources according to the results of previous observations. By means of simulations we compare SARA to previous solutions and show SARA's potential to find higher profit in both static and dynamic scenarios.
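The core allocation idea — rank monitoring attempts by expected profit per unit of resource and spend the budget greedily — can be illustrated with a minimal sketch. This is not the SARA algorithm itself; the `Mission` fields and the greedy density rule are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class Mission:
    name: str
    demand: int      # resources needed for one monitoring attempt
    profit: float    # profit earned if the monitored event occurs
    p_event: float   # current estimated probability the event occurs

def allocate(missions: list[Mission], budget: int) -> list[str]:
    """Greedily fund missions in order of expected profit per resource unit."""
    ranked = sorted(missions,
                    key=lambda m: m.p_event * m.profit / m.demand,
                    reverse=True)
    plan = []
    for m in ranked:
        if m.demand <= budget:
            plan.append(m.name)
            budget -= m.demand
    return plan
```

In an adaptive scheme such as the one the abstract describes, the `p_event` estimates would be updated from previous observations before each reallocation round.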

    Progressive damage assessment and network recovery after massive failures

    After a massive-scale failure, the assessment of damage to communication networks requires local interventions and remote monitoring. While previous work on network recovery requires complete knowledge of the damage extent, we address the problems of damage assessment and critical service restoration jointly. We propose a polynomial-time algorithm called Centrality-based Damage Assessment and Recovery (CeDAR), which performs a joint activity of failure monitoring and restoration of network components. CeDAR works under limited availability of recovery resources and optimizes service recovery over time. We modified two existing approaches to network recovery so that they can also exploit incremental knowledge of the failure extent. Through simulations we show that CeDAR outperforms the previous approaches in terms of recovery resource utilization and accumulative flow of the critical service over time.

    Enabling discoverable trusted services for highly dynamic decentralized workflows

    Fifth generation (5G) mobile networks will revolutionize edge-based computing by providing fast and reliable network capabilities to remote sensors, devices and microservices. This heralds new opportunities for researchers, allowing remote instrumentation and analytic capabilities to be as accessible as local resources. The increased availability of remote data and services presents new opportunities for collaboration, yet introduces challenges for workflow orchestration, which will need to adapt to consider an increased choice of available services, including those from trusted partners and the wider community. In this paper we outline a workflow approach that provides decentralized discovery and orchestration of verifiably trustable services in support of multi-party operations. We base this work on the adoption of standardised data models and protocols emerging from hypermedia research, which has demonstrated success in using combinations of Linked Data, Web of Things (WoT) and semantic technologies to provide mechanisms for autonomous goal-directed agents to discover, execute and reuse new heterogeneous resources and behaviours in large-scale, dynamic environments. We adopt Verifiable Credentials (VCs) to securely share information amongst peers based on prior service usage in a cryptographically secure and tamper-proof way, providing a trust-based framework for ratifying service qualities. Collating these new service description channels and integrating with existing decentralized workflow research based on vector symbolic architecture (VSA) provides an enhanced semantic search space for efficient and trusted service discovery that will be necessary for 5G edge-computing environments.

    On Exploiting Transient Social Contact Patterns for Data Forwarding in Delay-Tolerant Networks


    Trustable service discovery for highly dynamic decentralized workflows

    The quantity and capabilities of smart devices and sensors deployed as part of the Internet of Things (IoT) and accessible via remote microservices is set to rise dramatically as the provision of interactive data streaming increases. This introduces opportunities to rapidly construct new applications by interconnecting these microservices in different workflow configurations. The challenge is to discover the required microservices, including those from trusted partners and the wider community, whilst being able to operate robustly under diverse networking conditions. This paper outlines a workflow approach that provides decentralized discovery and orchestration of verifiably trustable services in support of multi-party operations. The approach is based on the adoption of patterns from self-sovereign identity research, notably Verifiable Credentials, to share information amongst peers based on attestations of service descriptions and prior service usage in a privacy-preserving and secure manner. This provides a dynamic, trust-based framework for ratifying and evaluating the qualities of different services. Collating these new service descriptions and integrating with existing decentralized workflow research based on vector symbolic architecture (VSA) provides an enhanced semantic search space for efficient and trusted service discovery that is necessary to support a diverse range of emerging edge-computing environments. An architecture for a dynamic decentralized service-discovery system is designed and described through application to a scenario that uses trusted peers' reported experiences of an anomaly detection service to determine service selection.
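The attestation idea can be sketched minimally. Real Verifiable Credentials use public-key signatures over standardised data models; in this illustration an HMAC over a canonicalized claim stands in for the signature, so the function names and claim fields are assumptions for the sketch, not the paper's implementation:

```python
import hashlib
import hmac
import json

def attest(secret: bytes, claim: dict) -> dict:
    """Wrap a service-usage claim with a MAC so peers can detect tampering."""
    payload = json.dumps(claim, sort_keys=True).encode()  # canonical form
    tag = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def verify(secret: bytes, attestation: dict) -> bool:
    """Recompute the MAC over the claim and compare in constant time."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["tag"])
```

With public-key signatures instead of a shared secret, any peer could verify an attestation without being able to forge one — the property the VC model provides.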

    Financial Characteristics of Companies Audited by Large Audit Firms

    Purpose – The purpose of this paper is to examine how financial characteristics are associated with the choice of a big audit firm, with further investigation of the agency costs of free cash flows. Design/methodology/approach – The sample used for this work includes industrial listed companies from Germany and France. To test our hypothesis, we used a number of logit models, extending the standard audit-firm selection model to include the variables of interest. Following previous work, our dependent dummy variable is Big4 or non-Big4. Findings – We observed that most independent variables for the German companies show results similar to previous work, but we did not obtain the same results for the French industry. Moreover, our findings suggest that total debt and dividends can be an important reason for choosing a large audit firm, reducing the agency costs of free cash flows. Research limitations/implications – This study has some limitations in the measurement of audit fees and also generates opportunities for additional research. Originality/value – The paper provides one aspect of the relationship between the agency costs of free cash flow and the choice of a large auditing firm, which stems from investors' demand for higher-quality audits.

    Globalization and the Transmission of Social Values: The Case of Tolerance


    National trends in total cholesterol obscure heterogeneous changes in HDL and non-HDL cholesterol and total-to-HDL cholesterol ratio: a pooled analysis of 458 population-based studies in Asian and Western countries

    Background: Although high-density lipoprotein (HDL) and non-HDL cholesterol have opposite associations with coronary heart disease, multi-country reports of lipid trends only use total cholesterol (TC). Our aim was to compare trends in total, HDL and non-HDL cholesterol and the total-to-HDL cholesterol ratio in Asian and Western countries. Methods: We pooled 458 population-based studies with 82.1 million participants in 23 Asian and Western countries. We estimated changes in mean total, HDL and non-HDL cholesterol and mean total-to-HDL cholesterol ratio by country, sex and age group. Results: Since ~1980, mean TC increased in Asian countries. In Japan and South Korea, the TC rise was due to rising HDL cholesterol, which increased by up to 0.17 mmol/L per decade in Japanese women; in China, it was due to rising non-HDL cholesterol. TC declined in Western countries, except in Polish men. The decline was largest in Finland and Norway, at ~0.4 mmol/L per decade. The decline in TC in most Western countries was the net effect of an increase in HDL cholesterol and a decline in non-HDL cholesterol, with the HDL cholesterol increase largest in New Zealand and Switzerland. Mean total-to-HDL cholesterol ratio declined in Japan, South Korea and most Western countries, by as much as ~0.7 per decade in Swiss men (equivalent to a ~26% decline in coronary heart disease risk per decade). The ratio increased in China. Conclusions: HDL cholesterol has risen and the total-to-HDL cholesterol ratio has declined in many Western countries, Japan and South Korea, with only a weak correlation with changes in TC or non-HDL cholesterol.
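Why TC alone can obscure these trends follows directly from the definitions: non-HDL cholesterol is TC minus HDL, and the ratio is TC divided by HDL, so a rise in HDL with flat non-HDL raises TC while lowering the ratio. A small sketch with illustrative numbers (not data from the study):

```python
def non_hdl(total: float, hdl: float) -> float:
    """Non-HDL cholesterol (mmol/L): total minus HDL."""
    return total - hdl

def tc_hdl_ratio(total: float, hdl: float) -> float:
    """Total-to-HDL cholesterol ratio (dimensionless)."""
    return total / hdl

# Illustrative decade: HDL rises by 0.17 mmol/L, non-HDL stays flat at 4.0.
before = tc_hdl_ratio(5.00, 1.00)  # TC 5.00
after = tc_hdl_ratio(5.17, 1.17)   # TC went UP, yet the ratio went DOWN
```

The same mechanism explains the Japan/South Korea pattern in the abstract: rising TC driven entirely by HDL is a favourable change that a TC-only report would misread.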

    Risk Portfolio Optimization Using the Markowitz MVO Model, in Relation to Human Limitations in Predicting the Future, from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk — reflecting human inability to predict the future precisely, as written in the Al-Qur'an, surah Luqman, verse 34 — they have to manage it to yield an optimal portfolio. The objective is to minimize the variance among all portfolios that achieve at least a certain expected return, or alternatively, to maximize expected return for a given level of risk. This study focuses on optimizing the risk portfolio using the Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks for the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio produces a convex quadratic program: minimize the objective function xᵀQx subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the optimal risk portfolio over several investments, solved using MATLAB R2007b together with its graphical analysis.
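The minimum-variance case of this quadratic program has a closed form when the only constraint is full investment (1ᵀw = 1, shorting allowed): w = Q⁻¹1 / (1ᵀQ⁻¹1), where Q is the covariance matrix. A Python sketch of that formula (an illustration only, not the paper's MATLAB implementation; the full program with the return floor μᵀx ≥ r requires a QP solver):

```python
import numpy as np

def min_variance_weights(cov: np.ndarray) -> np.ndarray:
    """Minimum-variance portfolio weights under 1'w = 1, shorting allowed:
    w = Q^{-1} 1 / (1' Q^{-1} 1), computed via a linear solve."""
    ones = np.ones(cov.shape[0])
    x = np.linalg.solve(cov, ones)  # x = Q^{-1} 1
    return x / (ones @ x)
```

For uncorrelated assets the weights come out inversely proportional to each asset's variance — the less volatile asset receives the larger allocation.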